Relational Memory-Augmented Language Models

Authors

Abstract

We present a memory-augmented approach to condition an autoregressive language model on a knowledge graph. We represent the graph as a collection of relation triples and retrieve relevant relations for a given context to improve text generation. Experiments on the WikiText-103, WMT19, and enwik8 English datasets demonstrate that our approach produces a better language model in terms of perplexity and bits per character. We also show that relational memory improves coherence, is complementary to token-based memory, and enables causal interventions. Our approach provides a simple yet effective way to combine an autoregressive language model with a knowledge graph for more coherent and logical generation.
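The mechanism the abstract describes, retrieving relation triples relevant to a context and conditioning the language model on them, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the example knowledge graph, the `<rel>` linearization format, and the substring-matching retrieval heuristic are all assumptions for demonstration.

```python
# Hypothetical sketch of relational memory retrieval: given a context,
# select knowledge-graph triples whose head entity the context mentions,
# then linearize them into a text prefix for an autoregressive LM.

def retrieve_relations(triples, context, top_k=2):
    """Return up to top_k (head, relation, tail) triples whose head
    entity appears in the context (simple case-insensitive match)."""
    matches = [t for t in triples if t[0].lower() in context.lower()]
    return matches[:top_k]

def linearize(triples):
    """Flatten retrieved triples into a conditioning prefix."""
    return " ".join(f"<rel> {h} {r} {t}" for h, r, t in triples)

# Toy knowledge graph (illustrative data, not from the paper).
kg = [
    ("Barack Obama", "born_in", "Honolulu"),
    ("Honolulu", "located_in", "Hawaii"),
    ("Paris", "capital_of", "France"),
]

context = "Barack Obama grew up in"
retrieved = retrieve_relations(kg, context)
prefix = linearize(retrieved)
# The LM would then generate conditioned on: prefix + context
```

A real system would replace the substring match with a learned retriever and feed the linearized triples through a separate memory encoder, but the retrieve-then-condition structure is the same.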


Similar articles

Reasoning with Memory Augmented Neural Networks for Language Comprehension

Hypothesis testing is an important cognitive process that supports human reasoning. In this paper, we introduce a computational hypothesis testing approach based on memory augmented neural networks. Our approach involves a hypothesis testing loop that reconsiders and progressively refines a previously formed hypothesis in order to generate new hypotheses to test. We apply the proposed approach ...


Relational Language and Relational Thought

Human cognitive abilities are remarkable. We easily go beyond what is perceptually available to reason about abstract systems. Our cognitive ability to adapt to a vast range of environments, and even to alter our environment to suit our desires, has given our species so great an advantage over other mammals that we are now poised to exterminate most of our former predators, and must use our ing...


Linguistically-augmented perplexity-based data selection for language models

Antonio Toral (School of Computing, Dublin City University, Dublin, Ireland), Pavel Pecina (Faculty of Mathematics and Physics, Charles University in Prague, Czech Republic), Longyue Wang (Natural Language Processing & Portuguese-Chinese Machine Translation Laboratory, Department of Com...


Cache-Augmented Latent Topic Language Models for Speech Retrieval

We aim to improve speech retrieval performance by augmenting traditional N-gram language models with different types of topic context. We present a latent topic model framework that treats documents as arising from an underlying topic sequence combined with a cache-based repetition model. We analyze our proposed model both for its ability to capture word repetition via the cache and for its sui...


Requirements for Programming Language Memory Models

One of the goals of the designers of the Java programming language was that multithreaded programs written in Java would have consistent and well-defined behavior. This would allow Java programmers to understand how their programs might behave; it would also allow Java platform architects to develop their platforms in a flexible and efficient way, while still ensuring that Java programs ran on ...



Journal

Journal title: Transactions of the Association for Computational Linguistics

Year: 2022

ISSN: 2307-387X

DOI: https://doi.org/10.1162/tacl_a_00476